
As the viral trend of using ChatGPT to turn personal photos into doll-style images spreads, experts are raising serious concerns about the privacy risks involved.
To join the trend, users send ChatGPT a photo of themselves along with personal details, such as their hobbies, to be included in the toy’s packaging design.
Sounds pretty harmless, right? However, Tom Vazdar, head of cybersecurity at the Open Institute of Technology, revealed some of the dangers of sharing personal data with AI.
According to him, every time you share a photo with ChatGPT, it becomes part of the AI’s data ecosystem. “This includes EXIF data attached to the image file, such as the time the photo was taken and the GPS coordinates of where it was taken,” Vazdar explained to Wired.
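The EXIF metadata Vazdar mentions is stored in a JPEG file’s APP1 segment, near the start of the byte stream, and can include the capture time and GPS coordinates. As a rough illustration of what “stripping metadata before uploading” means in practice, here is a minimal sketch using only the Python standard library; the `strip_exif` helper is a simplified, hypothetical example (real tools such as `exiftool` or image libraries handle many more cases):

```python
import struct

def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Remove APP1 (EXIF) segments from a JPEG byte stream.

    JPEG files start with the SOI marker (0xFFD8). Metadata such as
    EXIF lives in APP1 segments (marker 0xFFE1) near the start of the
    file; each segment stores a 2-byte big-endian length that counts
    the length field itself but not the marker.
    """
    if jpeg_bytes[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG stream")
    out = bytearray(b"\xff\xd8")
    i = 2
    while i + 4 <= len(jpeg_bytes):
        marker, seg_len = struct.unpack(">HH", jpeg_bytes[i:i + 4])
        if not (0xFFE0 <= marker <= 0xFFEF or marker == 0xFFFE):
            # Not a metadata segment: keep the rest of the file as-is.
            out += jpeg_bytes[i:]
            break
        if marker == 0xFFE1:
            # APP1 carries EXIF: GPS coordinates, timestamp, camera model.
            i += 2 + seg_len  # skip marker + segment entirely
        else:
            out += jpeg_bytes[i:i + 2 + seg_len]
            i += 2 + seg_len
    return bytes(out)

# Toy JPEG: SOI + fake EXIF APP1 segment + start-of-scan data.
exif_seg = b"\xff\xe1" + struct.pack(">H", 2 + 6) + b"Exif\x00\x00"
fake_jpeg = b"\xff\xd8" + exif_seg + b"\xff\xda\x00\x02restofimage"
cleaned = strip_exif(fake_jpeg)
```

Note that many platforms also re-encode or strip metadata on upload, but that behavior varies by service and is not something users can rely on.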
“Since platforms like ChatGPT operate in a conversational manner, behavioral data is also collected — such as what you typed, what kind of images you requested, how you interacted with the interface, and how often you did so.”
So essentially, when you send a photo of your face, you’re not only giving the AI access to your facial data but also to what’s in the background — such as your location or other people who might appear in the image.
“This trend, whether by design or by opportunistic chance, is providing the company with large volumes of new, high-quality facial data from diverse age groups, ethnicities, and geographic regions,” Vazdar added.
This content was created with the help of AI and reviewed by the editorial team.
